    How Principals Bridge and Buffer the New Demands of Teacher Quality and Accountability: A Mixed-Methods Analysis of Teacher Hiring

    In this mixed-methods study, we examine the degree to which district- and building-level administrators accommodate teacher-quality and test-based accountability policies in their hiring practices. We find that administrators negotiated local hiring goals with the characteristics emphasized by federal and state teacher-quality policies, such as subject knowledge and teaching skills. While district administrators and principals largely bridged to external certification requirements, some principals buffered their hiring decisions from the pressures of test-based accountability. Principals who bridged to test-based accountability gave greater weight to subject knowledge and teaching skills. We find that bridging and buffering differ by policy and cannot be applied straightforwardly to accountability policies. In particular, separating the indirect effect of external accountability from that of other policies influencing principal hiring is difficult. Our analysis also highlights tensions among local, state, and federal policies regarding teacher quality, and the potential of accountability to permeate noninstructional school decision making.

    Independent component analysis for the identification of sources of variation in an industrial NIRS application

    A Near Infrared Spectroscopy (NIRS) industrial application was developed by the LPF-Tagralia team and transferred to a Spanish dehydrator company (Agrotécnica Extremeña S.L.) for the classification of dehydrator onion bulbs for breeding purposes. The automated operation of the system has allowed the classification of more than one million onion bulbs during the 2004 to 2008 seasons (Table 1). The performance achieved by the original model (R² = 0.65; SEC = 2.28 °Brix) was sufficient for qualitative classification thanks to the broad range of variation of the initial population (18 °Brix). Nevertheless, a reduction in the classification performance of the model has been observed with the passing of seasons. One reason put forward is the reduction in the range of variation that naturally occurs during a breeding process; another is variation in parameters other than the variable of interest, whose effects probably affect the measurements [1]. This study applies Independent Component Analysis (ICA) to this highly variable dataset from an industrial NIRS application in order to identify the different sources of variation present across seasons.
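
    As a rough illustration of the approach, the sketch below decomposes a matrix of NIR spectra into independent sources of variation. It assumes scikit-learn's FastICA as a generic ICA implementation and uses synthetic placeholder spectra; the data and the number of components are illustrative, not values from the industrial application described above.

    ```python
    # Hedged sketch: ICA on a spectra matrix to separate sources of variation.
    # FastICA stands in for a generic ICA implementation; the data are synthetic.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 256))        # placeholder for (samples x wavelengths) NIR spectra

    X_centered = X - X.mean(axis=0)        # centre each wavelength channel

    ica = FastICA(n_components=5, random_state=0)   # component count is a modelling choice
    S = ica.fit_transform(X_centered)      # (samples x components) source scores
    A = ica.mixing_                        # (wavelengths x components) mixing "spectra"

    # Each column of A can be inspected as the spectral signature of one source of
    # variation (the variable of interest vs. interfering effects), while S shows
    # how strongly each source contributes to every sample or season.
    print(S.shape, A.shape)
    ```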

    Automatic de-noising of close-range hyperspectral images with a wavelength-specific shearlet-based image noise reduction method

    Hyperspectral imaging (HSI) has become an essential tool for exploring spatially resolved properties of materials in analytical chemistry. However, due to various technical factors such as detector sensitivity, choice of light source and experimental conditions, the recorded data contain noise. The presence of noise in the data limits the potential of data processing tasks such as classification and can even make them ineffective. Therefore, reduction or removal of noise from the data is a useful step to improve data modelling. In the present work, the potential of a wavelength-specific shearlet-based image noise reduction method was utilised for automatic de-noising of close-range HS images. The shearlet transform is a special type of composite wavelet transform that utilises the shearing properties of images. The method first utilises the spectral correlation between wavelengths to distinguish the levels of noise present in different image planes of the data cube. Based on the level of noise present, the method adapts the use of the 2-D non-subsampled shearlet transform (NSST) coefficients obtained from each image plane to perform spatial and spectral de-noising. Furthermore, the method was compared with two commonly used pixel-based spectral de-noising techniques, Savitzky-Golay (SAVGOL) smoothing and median filtering. The methods were compared using simulated data, with either Gaussian noise or Gaussian plus spike noise added, and real HSI data. As an application, the methods were tested to determine the efficacy of a visible-near infrared (VNIR) HSI camera for non-destructive automatic classification of six commercial tea products. De-noising with the shearlet-based method resulted in a visual improvement in the quality of the noisy image planes and spectra of both simulated and real HSI. The spectral correlation was highest with the shearlet-based method. The peak signal-to-noise ratio (PSNR) obtained using the shearlet-based method was higher than that for SAVGOL smoothing and median filtering. There was a clear improvement in the classification accuracy of the SVM models for both the simulated and real HSI data that had been de-noised using the shearlet-based method. The method presented is a promising technique for automatic de-noising of close-range HS images, especially when the amount of noise is high and present in consecutive wavelengths.
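
    The shearlet-based method itself is not reproduced here, but the sketch below illustrates the two pixel-based baselines named above, SAVGOL smoothing and median filtering, applied along the spectral axis of a synthetic hyperspectral cube, together with the PSNR metric used for the comparison. The cube, the noise level and the window sizes are illustrative assumptions.

    ```python
    # Hedged sketch: spectral-axis SAVGOL smoothing and median filtering of a
    # noisy hyperspectral cube, scored with PSNR. All data here are synthetic.
    import numpy as np
    from scipy.signal import savgol_filter
    from scipy.ndimage import median_filter

    def psnr(clean, denoised):
        """Peak signal-to-noise ratio in dB for data scaled to [0, 1]."""
        mse = np.mean((clean - denoised) ** 2)
        return 10.0 * np.log10(1.0 / mse)

    # Synthetic cube: (rows, cols, wavelengths), values in [0, 1]
    rng = np.random.default_rng(1)
    clean = np.clip(rng.random((32, 32, 100)).cumsum(axis=2) / 100.0, 0.0, 1.0)
    noisy = np.clip(clean + rng.normal(scale=0.05, size=clean.shape), 0.0, 1.0)

    # SAVGOL smoothing of every pixel spectrum (window and order are assumptions)
    savgol_dn = savgol_filter(noisy, window_length=11, polyorder=2, axis=2)

    # Median filtering along the spectral axis only
    median_dn = median_filter(noisy, size=(1, 1, 5))

    print(f"PSNR SAVGOL: {psnr(clean, savgol_dn):.1f} dB")
    print(f"PSNR median: {psnr(clean, median_dn):.1f} dB")
    ```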

    Policy Brief: Mix and Match: What Principals Really Look for When Hiring Teachers

    The vast majority of research and policy related to teacher quality focuses on the supply of teachers and ignores teacher demand. In particular, the important role of school principals in hiring teachers is rarely considered. Using interviews with school principals in a midsized Florida school district, we provide an exploratory mixed-methods analysis of the teacher characteristics principals prefer. Our findings contradict the conventional wisdom that principals undervalue content knowledge and intelligence. Principals in our study ranked content knowledge third among a list of twelve characteristics. Intelligence does appear less important at first glance, but this is apparently because principals believe all applicants who meet certification requirements meet a minimum threshold of intelligence and because some intelligent teachers have difficulty connecting with students. More generally, we find that principals prefer an "individual mix" of personal and professional qualities. They also create an "organizational mix," hiring teachers who differ from those already in the school in terms of race, gender, experience, and skills, and an "organizational match," in which teachers have similar work habits and a high propensity to remain with the school over time. Because of tenure rules, many principals also prefer less experienced (untenured) teachers, even though research suggests that they are less effective.

    Application of Independent Components Analysis with the JADE algorithm and NIR hyperspectral imaging for revealing food adulteration

    In recent years, Independent Components Analysis (ICA) has proven to be a powerful signal-processing technique for solving Blind-Source Separation (BSS) problems in different scientific domains. In the present work, an application of ICA for processing NIR hyperspectral images to detect traces of peanut in wheat flour is presented. Processing was performed without a priori knowledge of the chemical composition of the two food materials. The aim was to extract the source signals of the different chemical components from the initial data set and to use them to determine the distribution of peanut traces in the hyperspectral images. To determine the optimal number of independent components to be extracted, the Random ICA by blocks method was used. This method is based on the repeated calculation of several models using an increasing number of independent components, after randomly segmenting the data matrix into two blocks and then calculating the correlations between the signals extracted from the two blocks. The extracted ICA signals were interpreted and their ability to classify peanut and wheat flour was studied. Finally, all the extracted ICs were used to construct a single synthetic signal that could be used directly with the hyperspectral images to enhance the contrast between the peanut and the wheat flour in a real multi-use industrial environment. Furthermore, feature extraction methods (a connected-components labelling algorithm followed by a flood-fill method to extract object contours) were applied to target the spatial location of peanut traces. A good visualization of the distribution of peanut traces was thus obtained.
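
    A minimal sketch of the overall workflow described above follows: the hyperspectral cube is unfolded into a pixels-by-wavelengths matrix, independent components are extracted, one component's scores are mapped back to the image plane, and contiguous regions are located with connected-components labelling. FastICA is used here as a stand-in for the JADE algorithm, and the cube, the component count and the threshold are illustrative assumptions.

    ```python
    # Hedged sketch: unfold a hyperspectral cube, extract ICs, map scores back to
    # the image and label connected regions. FastICA replaces JADE; data are synthetic.
    import numpy as np
    from sklearn.decomposition import FastICA
    from scipy.ndimage import label

    rows, cols, n_wl = 64, 64, 200
    rng = np.random.default_rng(2)
    cube = rng.normal(size=(rows, cols, n_wl))      # placeholder for a real NIR-HSI cube

    X = cube.reshape(rows * cols, n_wl)             # unfold: one spectrum per pixel
    ica = FastICA(n_components=6, random_state=0, max_iter=500)
    S = ica.fit_transform(X - X.mean(axis=0))       # pixel scores for each IC

    ic_map = S[:, 0].reshape(rows, cols)            # score image of one component

    # Threshold the score image and label connected regions (candidate adulterant traces)
    mask = ic_map > ic_map.mean() + 2.0 * ic_map.std()
    labels, n_regions = label(mask)
    print(f"{n_regions} connected regions above threshold")
    ```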

    Recent trends in multi-block data analysis in chemometrics for multi-source data integration

    In recent years, multi-modal measurements of process and product properties have become widely popular. Classical chemometric methods such as principal component analysis (PCA) and partial least squares regression (PLS) are sometimes not adequate to analyze this kind of data. Several multi-block methods have emerged for this purpose; however, their use is largely limited to chemometricians, and non-experts have little experience with such methods. To address this, the present review provides a brief overview of the multi-block data analysis concept, the various tasks that can be performed with it, and the advantages and disadvantages of different techniques. Moreover, topics ranging from basic multi-block data visualization to advanced innovative applications such as calibration transfer are briefly highlighted. Finally, a summary of software resources available for multi-block data analysis is provided.
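
    As a concrete, if simplistic, illustration of the multi-block idea discussed in the review, the sketch below scales two data blocks to equal total sum of squares, concatenates them and runs an ordinary PCA so that neither block dominates the common model. The blocks are synthetic placeholders and the scaling choice is an assumption, not a method prescribed by the review.

    ```python
    # Hedged sketch: block-scaled concatenation followed by PCA as a simple
    # multi-block baseline. Both blocks are synthetic placeholders.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    block1 = rng.normal(size=(80, 50))    # e.g. spectra from one instrument (hypothetical)
    block2 = rng.normal(size=(80, 12))    # e.g. process variables (hypothetical)

    def block_scale(B):
        """Centre the block and scale it to unit total sum of squares."""
        Bc = B - B.mean(axis=0)
        return Bc / np.sqrt((Bc ** 2).sum())

    X = np.hstack([block_scale(block1), block_scale(block2)])

    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)          # common scores across both blocks
    print(scores.shape, pca.explained_variance_ratio_)
    ```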

    Modern tests of Lorentz invariance

    Motivated by ideas about quantum gravity, a tremendous amount of effort over the past decade has gone into testing Lorentz invariance in various regimes. This review summarizes both the theoretical frameworks for tests of Lorentz invariance and the experimental advances that have made new high-precision tests possible. The current constraints on Lorentz-violating effects from both terrestrial experiments and astrophysical observations are presented. Comment: Modified and expanded discussions of various points. Numerous references added. Version matches that accepted by Living Reviews in Relativity.

    Quantum Spacetime Phenomenology

    I review the current status of phenomenological programs inspired by quantum-spacetime research. I stress in particular the significance of results establishing that certain data analyses provide sensitivity to effects introduced genuinely at the Planck scale. My main focus is on phenomenological programs that have managed to affect the directions taken by studies of quantum-spacetime theories. Comment: 125 pages, LaTeX. This V2 is updated and more detailed than the V1, particularly for quantum-spacetime phenomenology. The main text of this V2 is about 25% longer than that of the V1. The reference list is roughly doubled.